The big meh

Remember Douglas Adams’ 1979 novel “The Hitchhiker’s Guide to the Galaxy”? It began with some technology snark, dismissing Earth as a planet whose life-forms “are so amazingly primitive that they still think digital watches are a pretty neat idea.” But that was then, in the early stages of the information technology revolution.

Since then, we’ve moved on to much more significant things, so much so that the big technology idea of 2015, so far, is a digital watch. But this one tells you to stand up if you’ve been sitting too long!

OK, I’m snarking, too. But there is a real question here. Everyone knows we live in an era of incredibly rapid technological change, which is changing everything. But what if what everyone knows is wrong? And I’m not being wildly contrarian here. A growing number of economists, looking at the data on productivity and incomes, are wondering if the technological revolution has been greatly overhyped — and some technologists share their concern.

We’ve been here before. “The Hitchhiker’s Guide” was published during the era of the “productivity paradox,” a two-decade-long period during which technology seemed to be advancing rapidly — personal computing, cellphones, local area networks and the early stages of the Internet — yet economic growth was sluggish and incomes stagnant. Many hypotheses were advanced to explain that paradox, with the most popular probably being that inventing a technology and learning to use it effectively aren’t the same thing. Give it time, said economic historians, and computers eventually will deliver the goods (and services).

This optimism seemed vindicated when productivity growth finally took off circa 1995. Progress was back — and so was America, which seemed to be at the cutting edge of the revolution.

But a funny thing happened on the way to the techno-revolution. We did not, it turned out, get a sustained return to rapid economic progress. Instead, it was more of a one-time spurt, which sputtered out about a decade ago. Since then, we’ve been living in an era of iPhones and iPads and iDontKnows, but even if you adjust for the effects of the financial crisis, growth and trends in income have reverted to the sluggishness that characterized the 1970s and 1980s.

In other words, at this point, the whole digital era, spanning more than four decades, is looking like a disappointment. New technologies have yielded great headlines, but modest economic results. Why?

One possibility is that the numbers are missing the reality, especially the benefits of new products and services. I get a lot of pleasure from technology that lets me watch streamed performances by my favorite musicians, but that doesn’t get counted in GDP. Still, new technology is supposed to serve businesses as well as consumers, and should be boosting the production of traditional as well as new goods. The big productivity gains of the period from 1995 to 2005 came largely in things such as inventory control, and showed up as much in nontechnology businesses such as retail as in high-technology industries themselves, if not more. Nothing like that is happening now.

Another possibility is that new technologies are more fun than fundamental. Peter Thiel, one of the founders of PayPal, famously remarked that we wanted flying cars but got 140 characters instead. And he’s not alone in suggesting that information technology that excites the Twittering classes might not be a big deal for the economy as a whole.

So, what do I think is going on with technology? The answer is that I don’t know — but neither does anyone else. Maybe my friends at Google are right, and Big Data soon will transform everything. Maybe 3-D printing will bring the information revolution into the material world. Or maybe we’re on track for another big meh.

What I’m pretty sure about, however, is that we ought to scale back the hype.

You see, writing and talking breathlessly about how technology changes everything might seem harmless, but, in practice, it acts as a distraction from more mundane issues — and an excuse for handling those issues badly. If you go back to the 1930s, you find many influential people saying the same kinds of things such people say nowadays: This isn’t really about the business cycle, never mind debates about macroeconomic policy; it’s about radical technological change and a workforce that lacks the skills to deal with the new era.

And then, thanks to World War II, we finally got the demand boost we needed, and all those supposedly unqualified workers — not to mention Rosie the Riveter — turned out to be quite useful in the modern economy, if given a chance.

Of course, there I go, invoking history. Don’t I understand everything is different now? Well, I understand why people like to say that. But that doesn’t make it true.

Paul Krugman is a syndicated columnist who writes for the New York Times News Service.